
# BF16 Quantized Inference

**DeepSeek-V3-0324 BF16** (MIT license, published by ModelCloud)
DeepSeek-V3-0324 is a BF16 version of the large language model released by DeepSeek AI, suitable for quantization and inference on GPUs that do not support FP8.
Tags: Large Language Model, Transformers
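As background on why a BF16 release matters for such GPUs: bfloat16 keeps float32's 8-bit exponent (so the full dynamic range) but truncates the mantissa to 7 bits. A minimal pure-Python sketch of the float32-to-bfloat16 conversion with round-to-nearest-even, written here only to illustrate the format (it is not DeepSeek's or any library's implementation):

```python
import struct

def float32_to_bf16_bits(x: float) -> int:
    """Return the 16-bit bfloat16 encoding of x, rounding to nearest even."""
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    # Bias chosen so that adding it rounds the low 16 bits to nearest,
    # with ties going to the even result on bit 16.
    rounding_bias = 0x7FFF + ((bits >> 16) & 1)
    return ((bits + rounding_bias) >> 16) & 0xFFFF

def bf16_bits_to_float32(b: int) -> float:
    """Decode a bfloat16 bit pattern back to a Python float (exact)."""
    (x,) = struct.unpack("<f", struct.pack("<I", (b & 0xFFFF) << 16))
    return x

# bfloat16 preserves float32's range but only ~3 decimal digits of precision:
print(bf16_bits_to_float32(float32_to_bf16_bits(1.0)))        # 1.0 exactly
print(bf16_bits_to_float32(float32_to_bf16_bits(3.1415926)))  # 3.140625
```

In practice a framework such as PyTorch handles this cast natively (e.g. loading weights as `torch.bfloat16`), which is what makes a BF16 checkpoint usable on GPUs that lack FP8 support.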